I sometimes speak at tech conferences, staff retreats, universities, and bootcamps. My talks tend to be:
Didactic: I teach a topic. I draw examples from my career, but the talk does not center me or my personal experience. I don’t talk about my journey into software development, for example.
Technical: I focus on things I think engineers should know (though I make sure to include something for designers and product folks, too). The examples come from application or machine learning development projects. The slides may have code samples.
Language and version agnostic: I stick to principles and approaches that remain valuable across languages, frameworks, and versions. I don’t talk about what’s new in the latest Android SDK release, for example.
So far, you can catch me in 2020 at:
ORD Camp (January 23-25, Chicago)—Organizing
GOTO Meetup (March 24, Chicago)—Panel on Developing Intercultural Vision
deliver:Agile 2020 (April 29-May 1, Columbus)—Workshop on Advanced Debugging
BlueTeamCon (June 20-21, Chicago)—Talk on Risk-Oriented Testing
PromptConf (Sometime in September or October, Chicago)—Organizing
Talks I Currently Give:
How to Level Up as a Technologist
Length: 35-45 Minutes
COUNTRY Financial Internal DevOps Conference, 2018
PearConf Meetup, 8th Light, Chicago, 2019
To thrive as a technologist, you need to constantly level up your skill set.
That sounds daunting: after all, there’s so much to learn. You might even have experienced false starts in the past, where you tried to learn a new skill and it didn’t work out.
It’s not because you can’t. In fact, I’m confident that you already have the innate ability to add breadth and depth to your skill set.
I know that because you already use that ability every day to stay current as a technologist.
Leveling up is itself a skill that you can sharpen. Today we’ll talk about some techniques that you can use to get better at leveling up. These techniques will help you channel your innate ability to learn, so you can broaden and deepen your skill set more effectively, and even enjoy doing it!
Here’s an abridged (22 minute) version of the talk that I gave at a meetup as a favor to a friend:
The Technology and Psychology of Refactoring
Length: 45-60 Minutes
PearConf 2019, Center on Halsted, Chicago, 2019
PearConf Distributed Lecture Series, 2019
When the requirements change out from under your tech team, your code has to change. So it’s worthwhile to build your skills in assessing code maintainability, deciding whether to refactor, and doing the refactor.
In this talk, we’ll answer questions like:
What does it mean for code to be maintainable, and how do we make code more maintainable?
How do we know when to refactor—and how do we know when to stop refactoring?
How do we sell stakeholders on giving us space to make a large refactor?
This talk includes both code samples and architecture samples from apps in use today.
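To give a flavor of the kind of maintainability question the talk takes on (this is my own hypothetical sketch, not a sample from the talk or from any real app), a small refactor often amounts to giving a name to a buried condition:

```python
# Before: the meaning of the condition is buried in the expression.
def ship_order_before(order):
    if order["paid"] and not order["cancelled"] and order["items"]:
        return "shipped"
    return "held"


# After: extracting a well-named helper makes the intent explicit,
# and gives future readers (and tests) a single place to look.
def is_shippable(order):
    return order["paid"] and not order["cancelled"] and bool(order["items"])


def ship_order(order):
    return "shipped" if is_shippable(order) else "held"
```

The behavior is unchanged; only the readability improves, which is the kind of judgment call the talk digs into.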
Here’s an abridged (22 minute) version of the talk that I gave at a lunch and learn, again, as a favor to a friend:
Under special circumstances, I can build and deliver a custom talk.
Right now, for each of the above talks, I’d love to get a clean video/audio recording. If your event can do that, let’s chat.
These are proposals I have submitted to conferences in the past year. If one sounds like a talk you’d like to see at your conference, feel free to reach out. If it sounds like a talk you’d like to see submitted to your conference, by all means, copy the whole abstract and paste it into your CFP. Just email me to let me know you did, please 🙂
Advanced Debugging: Strategies and Tactics
Length: 90 Minutes
Debugging: we spend most of our programming time doing it.
But we neglect it as a skill. We say it’s important, but we don’t deliberately practice it. Instead, we focus on shiny new languages, frameworks, and features. Our debugging skills lag behind the development of our other skills. And since we still spend so much time debugging, programming stays more frustrating for us than it needs to be.
Here’s how we make debugging hard for ourselves: we try to apply our assumptions from feature development to debugging, where they’re not true.
Case in point: we work on features with the starting assumption that we understand our code. But when we’re trying to track down a bug, that’s not accurate. We’re drawn to practices that align with our inaccurate assumption (like trying to fix it on the fly) but end up adding frustration to the debugging process. As we gain experience, we learn to ignore these temptations and do something else (like slow down and print out some variables), even if we don’t fully understand why.
But if we instead approach debugging from the starting perspective that we do not understand our code, we’re drawn to practices that *work* for debugging, rather than having to learn, through hard experience, to do the opposite of what an inaccurate assumption tempts us to do.
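To make that mindset concrete, here is a hypothetical sketch of mine (not material from the workshop) of the “slow down and print out some variables” tactic: instead of patching on the fly, we instrument intermediate state to find where reality diverges from our expectations.

```python
# A hypothetical buggy-looking function, instrumented for debugging.
# Rather than guessing at the cause, we check each intermediate value.
def total_price(items, discount_rate):
    subtotal = sum(item["price"] * item["qty"] for item in items)
    # Verify an assumption explicitly instead of trusting it.
    assert subtotal >= 0, f"unexpected negative subtotal: {subtotal}"
    discounted = subtotal * (1 - discount_rate)
    # Print intermediate state so we can see where expectations break.
    print(f"subtotal={subtotal}, discounted={discounted}")
    return round(discounted, 2)
```

The point is not the function itself but the posture: every intermediate value is treated as unknown until observed.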
In this workshop, you will learn:
Strategies to systematically track down a bug (and why our usual approach so often fails)
Tactics to gather information about your systems, so you can narrow down the causes of bugs faster
Practices that you can use to sharpen your debugging skills over time
You can expect some lecturing, complete with illustrations and anecdotes to help summarize the concepts and bring them to life. We will also practice our new skills by working through debugging exercises.
This session is targeted at engineers who work as individual contributors on code. I am confident that everyone from the junior level to the principal engineer can learn something from this workshop. If you don’t believe me, come see me beforehand and we’ll make a bet.
Detecting Context, from Activism to Software Architecture
Length: 30 Minutes
How can a teacher create a valuable classroom experience for students from different backgrounds?
Why do our workplaces leave us burned out and emotionally exhausted?
Why do books about object-oriented programming prefer composition to inheritance?
These questions seem unrelated, but they share something: assumed context. What is assumed context? It’s the details of a situation that we accidentally assume to be true in ALL situations.
For example, in teaching, we often default to reading and lecturing as the primary or only means of information transfer. When we understand where we got that model of education (white supremacist-built instructional institutions), we know where to look for alternatives that make our teaching more effective. When we accept advice about how to teach and learn, it’s very important for us to know “who is this advice from, and for?” Because it’s probably not everyone.
In the workplace, we assume it’s not professional to be angry or sad at work. When we understand where that comes from (a white supremacist understanding of “acceptable” behavior) and what it costs us (forcing people over time to cope by learning not to care about their work), we realize that we need to look to other models to produce great work, and we get clues about where to look. When we accept advice about how workplaces should be, it’s very important for us to know “who is this advice from, and for?” Because it’s probably not everyone.
In object-oriented programming, many resources universally tout composition as superior to inheritance. Why? Mobile frameworks use inheritance all the time. Try writing a mobile framework without it, and you’ll discover real fast what circumstances make inheritance suddenly seem like a great idea. It turns out that the assumed context of most OOP books is “writing an end-user application on top of an existing framework.” In that case, composition makes sense relative to inheritance more than it does in other contexts, like building a framework or writing a programming language. And when we accept advice about how to write our code, it’s very important for us to know “who is this advice from, and for?” Because it’s probably not everyone.
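To illustrate the point (a hypothetical sketch of mine, not a sample from the talk): frameworks invert control, so inheritance gives them hook points through which to call *your* code, while application code more often composes collaborators it owns.

```python
# Framework-owned base class (think Activity or ViewController):
# inheritance works here because the framework drives the lifecycle
# and calls into your subclass at defined hook points.
class Screen:
    def show(self):
        self.on_create()   # hooks the framework invokes for you
        self.on_resume()

    def on_create(self): pass
    def on_resume(self): pass


class HomeScreen(Screen):  # application code fills in the hooks
    def on_create(self):
        self.title = "Home"


# Application code, by contrast, usually composes collaborators,
# because it owns the control flow and wants swappable dependencies.
class Logger:
    def log(self, msg):
        return f"LOG: {msg}"


class CheckoutService:
    def __init__(self, logger):
        self.logger = logger   # composed, easy to replace in tests

    def checkout(self):
        return self.logger.log("checkout complete")
```

Same language, same program, two contexts: the “prefer composition” advice fits the second context far better than the first.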
These are just a few examples of how assumed contexts that activist work has taught me to recognize have translated, for me, into a more nuanced understanding of programming and technical practices.
In this talk, you’ll learn how to root out assumed context, too. You’ll learn to challenge assumptions in common programming advice, solicit new perspectives, and give yourself options when you previously thought you had none.
Talks I Have Retired:
Allyship in Times of Crisis
Length: 20 Minutes
Pivotal Labs Employee Professional Development Series, 2016
This talk is for allies who want to take care of marginalized communities affected by traumatic events. I gave it shortly after the Pulse shooting in Orlando, but the principles apply in many crisis situations.
In the event of a tragedy like this, we need allies to step up. It can be difficult to know what to say or do if you are not a part of the affected community. That’s what this talk is for: it’s a starting point for allies.
We start with some terminology and talk about what we mean by terms like target, ally, bystander, and crisis. Then we discuss the grief and fear that prevail within a target community after a crisis, and where allies can start to help with that.
Finally, we relate the discussion back to what an ally can do on a daily basis to help fight for equality—and how social change happens.
Full Video Available At:
Chelsea, come give this talk at my meetup!
I have given this talk only once, and I only ever will. I didn’t even rehearse it.
That’s why I asked Elliot, the best videographer I know, to record it: I knew, if the video or audio recording failed, the talk would be lost forever.
As you can see, Elliot pulled through and got a full recording. So if you want this talk at your meetup, you’re welcome to play the video.
Speaker Bio and Headshot:
Chelsea writes code for money. Her recent projects include an automated image processing pipeline for NASA Landsat images and a dual-platform mobile app to engage science enthusiasts in original research. She looks for clients who are saving the planet, advancing basic scientific research, or providing resources to underserved communities. She has been known to take projects in mobile development, web development, and machine learning. She streams some programming sessions to YouTube, so you can watch her code (and narrate!) in real time. She then turns the recordings into educational materials.
Chelsea also teaches Mobile Software Development at the Master’s Program in Computer Science at the University of Chicago. She is the author of chelseatroy.com and a book called Remote Work Sucks (the title is kind of a trap). She organizes two conferences: PromptConf (Chicago area, very technical) and ORD Camp (Chicago area, not nearly as technical).
Chelsea flings barbells around for fun. She drives an electric cafe cruiser named Gigi.