Ten years ago, my undergraduate honors project would have been a graduate thesis. My project used 774.87 CPU days (plus a lot more after graduation as I ran some more molecular dynamics simulations to bulk up the data), which means the equivalent amount of computation on a single-processor computer would have taken 2 years, 1 month, and 13 days. And that doesn’t even include analysis time.
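The CPU-days-to-calendar-time conversion above is easy to sanity-check. Here is a quick sketch in Python, assuming an average year of 365.25 days and an average month of 30.44 days (365.25 / 12); the constants are my assumptions, not from the original simulation logs:

```python
# Back-of-the-envelope check: convert 774.87 CPU-days into calendar time
# on a single processor, using average year/month lengths (assumptions).
CPU_DAYS = 774.87
DAYS_PER_YEAR = 365.25
DAYS_PER_MONTH = 30.44  # 365.25 / 12

years, rem = divmod(CPU_DAYS, DAYS_PER_YEAR)
months, days = divmod(rem, DAYS_PER_MONTH)

print(f"{int(years)} years, {int(months)} month(s), {int(days)} days")
# -> 2 years, 1 month(s), 13 days
```

With those averages, 774.87 CPU-days works out to the 2 years, 1 month, and 13 days quoted above.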
We hear a lot about how “technology is advancing fast” with a lot of filler words that generally mean “advancing” and “fast.” That’s all well and good, but realistically, what does it mean? It means that the knowledge high school and college students are amassing today will be old news in a couple years. It means I grew up around computers and learned how to type when I was in elementary school and played Power Pete on our school iMacs, but kids today are growing up around tablet devices and learn how to navigate through pages of apps to find Angry Birds. It means that the structure of DNA wasn’t even known when my dad was born, yet genetic screening was a routine analysis on my embryonic self.
In the time between my last genetics class at Stanford (winter quarter 2011) and now, probably hundreds of genetic associations have been discovered. Researchers have a deeper understanding of the mechanisms of important diseases, and I don’t even have an estimate of how many drugs are being developed to cure various conditions. I still know my way around the commonly used tools for exploring and navigating the human genome (some call-outs for dbSNP, SNPedia, OMIM, PharmGKB, and GeneReviews!), but if I weren’t working for a DTC genetics company, I’d be out of practice now and way out of the loop in a year.
So here’s a question: how do I stay in the loop?
Rather, how on earth do you expect me to stay in the loop?
With technology changing so fast and our understanding of the world growing more and more advanced, how are we supposed to keep track of it all? How do we keep ourselves from becoming outdated? Even as I head towards my future in medicine, I do stop and consider for a moment. Right now, my understanding of genetics is more advanced than that of the average physician, and possibly more advanced than that of the average genetic counselor (most of whom earned their certification recently). Take the following examples:
- I was working with a genetic counselor who kept insisting that BRCA mutations indicated a diagnosis of breast cancer. In reality, a BRCA mutation indicates an elevated lifetime risk, not a diagnosis. Since your genome remains (relatively) unchanged over your lifetime, if her statement were true, BRCA mutation carriers would be diagnosed with breast cancer at birth. Luckily, that's not the case.
- One of my coworkers recently dealt with an angry physician who demanded that we test his daughter’s Y-chromosome. Females don’t have a Y-chromosome — their lack thereof is what makes them female. (When my coworker tried to explain this, he yelled at her for being condescending.)
- Yesterday I explained to a genetic counselor that DNA has two strands that complement each other. Very exciting!
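That complementarity, incidentally, is about as simple as biology gets: each base pairs with a fixed partner (A with T, G with C), so one strand completely determines the other. A minimal Python sketch of the idea (the function name and example sequence are mine, purely for illustration):

```python
# Watson-Crick base pairing: A<->T, G<->C. Given one strand, the
# complementary strand read 5' to 3' is the reverse complement.
PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq: str) -> str:
    """Return the complementary strand, read 5' to 3'."""
    return "".join(PAIR[base] for base in reversed(seq))

print(reverse_complement("GATTACA"))  # -> TGTAATC
```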
I fully acknowledge that these people were probably well-educated, passed crucial exams, and are certified to treat patients wherever they practice. But maybe the mere fact that I graduated with my B.S. in Molecular and Cell Biology in 2011 makes me more knowledgeable on these matters than they are. So where am I going to be in twenty or thirty years? Am I fated to become obsolete, too?
Here’s the deal. I’d say my fellow premed biology majors from Stanford all have the foundation of knowledge that’s necessary for understanding how inheritance, genes, molecular biology, and physical conditions and traits are all linked together. I personally take that foundation for granted sometimes. But that foundation is what I’ll continue to need to keep myself from getting left behind as technology advances. The same way my keyboard skills begot my iPhone texting skills and my HTML skills begot my PHP skills begot my Java skills begot my C++ skills begot my Python skills*, I’ll have to learn to pick up new knowledge, understanding, and technology.
Isn’t that really what we should be taught? Some basic knowledge, yes, but ultimately we should be learning the problem solving process and framework we need to solve or understand any kind of situation that’s thrown our way. Keep that in mind the next time a project or a test seems impossible — you’re learning to learn effectively, and what skill is more important than that?
* Full disclosure, I don’t really count anything past PHP as actual “skill,” just capability. And let’s be real, my C++ and Python “capability” shouldn’t even be called that. But that’s kind of the point — it’s all moderately transferable on the same framework of understanding.