A lot has been on my mind lately. Figuring out my career and life is foremost among it.
I’ve been guest writing more. I had a post at The Research Whisperer a few weeks ago, about building a portfolio career and using it to try to transition into a new job, that seemed to do well. Part of the point was gaining experience.
I did some guest science writing too, both for UK-based websites/publications. One was a collaboration with my PI, and the other was for the UK Plant Sciences Federation on flowering time. I even emailed a flowering time scientist to get some quotes. That is pushing my comfort zone.
People have been passing job ads and opportunities along to me as well, which is incredible and part of why I am so grateful to platforms like Twitter. Which brings me to the #seriousacademic hashtag, started after The Guardian posted a short piece from a grad student who could not see the value of social media, arguing that it distracted from the real world in front of people and took focus away from actual academic research.
As much as I love Twitter, I never tell anyone they have to be on it. I also see most uses of the platform as legitimate…I suggest people start out just by listening in, following things they are interested in, and checking in once in a while. Finding things serendipitously can be great sometimes. And if you feel like responding or joining a discussion, then great.
My community is almost entirely online…I would love to have a more consistent real-world community of people I see regularly, but that is part of why I need a new job in a new place, something new. I tried being a serious academic. After years of trying, I’ve concluded I’d rather be a serious something else– ideally in the writing/editing world, where I can draw on my scientific skills as well.
Twitter has been great for getting my blog(s) out to the world…for those interested in plant science and my writing about mental health here. My goal has been to be a one-person broader impact for the plant science community– Twitter is my way of giving back, and it has fed back into my science in great ways too. I consider it education/outreach, though I am also writing about things I find interesting or am curious about. I’ve made genuine personal and professional connections because of Twitter. I hope I’ve contributed something and not just taken away.
I’d tell the “serious academic” grad student that building a network takes time, and if an all in-real-life/email chain of networking works for them, then awesome. No social media needed. However, I think social media has made me a better scientist. It’s rekindled a love of learning that I had lost. It’s opened my eyes to some things, like inclusion and diversity. I really want to learn new things, do better science, and live up to the amazing things I hear about people doing on Twitter every day.
I try to be a supportive ear and a celebrator of successes, and I pitch in when opportunities arise to do something specific that I can do (organizing a conference panel, for instance), or being a digital media coordinator for the conference I attend most years. Trying to stay on top of Twitter activity at a >1,000-person conference is hard, but I do think it is valuable as a record of the conference. Twitter is a good way for me to take notes and to listen to a talk as well, though there is definitely a balance to be struck between attention and tweeting– Twitter really shines as a sixth sense at conferences and as a networking tool. More people visit posters that presenters tweet about.
That said, lately I’ve felt really exhausted. Everything seems to take gargantuan effort, and little feels light anymore. Some of that is taking on more ambitious projects and trying to make things better than I’ve done before. Some, though, I fear, is burnout from all the extracurricular things I’ve been doing to try to figure out what’s next. Maybe I’m doing it all wrong? It’s hard for me to know.
Last, #seriousacademic reminded me of an essay by Sarah Cooper on Medium about why taking your ideas seriously is important. Like her, I didn’t take my ideas seriously for years. Starting my blogs, engaging on Twitter, and discussing real things there have gotten me to take my ideas seriously. However, I don’t take myself too seriously and do have fun on Twitter too. Twitter is great for having fun– that is part of how serious communities are built.
Twitter has gotten me connected to people, and I’m not sure that would have happened in real life in the last few years. It has, in many ways, saved my life. Are there plenty of people who can live without it? I’m sure there are. Even I need breaks sometimes. And because the community I built online has translated into the real world in many ways, I feel a lot better taking those social media breaks.
A friend of mine took me out to see the sunset the other night.
It was a gorgeous night. But I was distracted. Not really present. Thoughts kept interfering. I have things to write. Stuff to learn, like coding and R stats. Things along these lines (superimposed on actual images of the sunset):
When people talk about the all-consuming nature of working in science, this is what it looks like. Time away feels wasted. There’s never enough time spent. And especially as I’m trying to transition to a new career this year, down time feels like an unaffordable luxury. Even taking care of myself feels impermissible (and that one goes beyond just the sunset…I resent having to take time to go to the doctor).
And it’s not as if I am exactly enjoying work either. I still feel burned out a lot of the time. Still, after a few hours, and eating something, this time lapse my friend Holly Pierce took is pretty incredible:
I know time away is important, but it’s hard to feel that it’s OK to take time away until I get my life more settled. I hope that happens, but it’s still hard to see how it happens. I hope there’s a time when I don’t feel like I’m burning my candle at both ends.
Biochem Belle has been chronicling her career path from academia to ‘not academia’ in a series of blog posts. Part 3 is particularly about the transition point, one that sounds like it took awhile to get to. Dr. 24 Hours also wrote about figuring out a career by a sort of ‘faking it til you make it’ approach that runs anathema to academia. He wonders why academics can’t just do that.
This is on my mind…as a pre-transition phase academic, still trying to figure out what direction works for me…not just careerwise, but in life.
Why do academics have such trouble transitioning? Training? Lack of skills? I don’t quite think it’s either of those things. There’s a mindset that gets cultivated in academia. Some of it is due to a narrow devotion to a task and being in a culture that sends the message that the tenure track is THE path, nothing else. But that is starting to fade away as awareness spreads about the problems in academia and the fact that the tenure track is a minority employer of PhDs now.
I can’t speak for everyone else, but the academic mindset has given me a narrow set of operating parameters. We may well be more flexible than we look on paper, but there’s little evidence to demonstrate it, even if it’s true. I’m a blogger…so I guess I have that going for me as ‘flexibility’ goes. There are altruistic reasons people get into academia. It’s knowledge generation, it’s solving puzzles and answering questions, it’s doing something for the long term. While industry and companies in the private sector might have some of those things, the sense I have is that they’re all about short-term gain. There’s nothing wrong with that necessarily; it’s just that many academics are geared to be the shoulders of the giant that future researchers will rest upon. It’s a different mindset. And some academics may fear the ‘fast pace’ they hear about in the private sector– the fear being that deep thought does not occur in those places (it probably does on some level, but decisions are made with far less than complete information).
There’s also the mindset that career searching, networking, and other experiments like them aren’t very hypothesis driven. I think there’s a generation of scientists (me among them) that really thinks the only acceptable kind of work is hypothesis driven: observational studies are a joke, and anything open-ended, the ‘fishing expedition’ (aka mutant screens), is bad. Of course, good science can come out of all of these approaches, and all have their limitations and advantages, but hypothesis-driven questions currently reign supreme. Career searches and transitions, dating, the stuff of life may have testable hypotheses, but they are far from well-controlled experiments. We’re taught to take some risks with experiments, but even then there’s a need for proper controls, clean environments, etc. We’re taught to control for any chaos to measure what we need to measure. We want elegant experiments (think Pasteur’s flask experiment), but those aren’t easy to do with personal exploration or life.
Of course, scientists are also people…and I know I’ve been in my own particular ‘work is my life’ bubble for so long that it’s hard to know if I’m a complete freak/weirdo. Most scientists I know have significant others, some have kids, and they get their science done. Those ‘normal’ people do seem more able to handle transition. I think there’s a sense with PhDs and postdocs that we need permission to do anything from PIs, committees, or other mentors guiding our work. While that is true in other professions as well, the power differential between postdocs and PIs is (or seems to be) greater than that between most boss/employee relationships.
And last, every academic I have met has some degree of obsessiveness about them. A perfectionist (in the bad sense) streak that can induce analysis paralysis and make taking action, especially different and uncertain actions, harder. Too many academics probably have the so-called fixed mindset (as opposed to the much healthier growth mindset).
I don’t fully know what holds me back, but I am moving again. Learning, trying new things (still in this experimental phase, I feel), though still preoccupied with science. There’s also the ‘how do you know when you’re done?’ problem in academia…there’s always more that can be dug into (including in this post, I’m sure).
I think there’s an Einstein quote that says something like ‘you can’t get to a new place with thinking that got you where you are now’…so the challenge for the new career path seeking academic is partly one of trying to think differently about a lot of things. And that isn’t easy, but is possible. We are creative, intelligent, sometimes funny, and odd thinkers. Thinking differently and changing our minds is what we’re trained to do.
A few weeks ago, I was talking with some tweeps about learning in the lab as a Ph.D. student: how to learn to use shared department resources like confocal microscopes and qRT-PCR machines, any commonly used equipment, or how to learn a new technique, period. The way this is done now often seems to have all sorts of problems and shortcomings. How do you design a training system for trainees who are all at different levels of knowledge?
Confocals are complex, and the software + hardware combination allows for all sorts of possibilities, and potential for things to go wrong, especially with the objectives on the microscope. While it’s unlikely anyone needs to know everything about every function possible, it’s hard to tailor education to each student. I’m sure there are all sorts of online resources now for learning a lot of these things, but it’s always hard to know where to go. Do companies have ‘virtual confocals’ now, where you can play around and simulate what would happen with various functions and what the output images would look like? In our department, we have a fantastic resource in our research support specialist. She manages all the common equipment, knows a lot about all of it, and everyone is required to sit down with her for an introductory session on anything we want to use regularly. This is good and useful as far as it goes, but isn’t quite sufficient in some ways. One session is often not quite enough (at least for me…it is enough to learn how not to break something, and maybe that’s the point…the rest is up to us to tinker on our own). And that’s sort of fine as an adult scientist; guide your own learning, etc. It’s what we’re supposed to do. Some departments don’t even have the basics of this training in place, though (or it’s the non-active-learning form of training, with someone just talking in front of a room).
I’m trying to learn to code and to learn my statistics better as well; I’m going to take a MOOC on statistics this term, and I dabble in coding on my own. It’s up to me, and that’s fine. These things seem to get pushed to the edge, fit into spare time, taken away from life. It’s important to make time to learn new things, and yet the culture of academia seems to make it a fringe activity, not a core function. Asking people for help is tricky, as we’re all busy. Or asking for feedback…it seems to be secondary to getting things done too much of the time. Some of this gets at what Lenny Teytelman writes about: the need for improved training of PhDs and postdocs, for research and non-research careers alike. It’s something that can easily go by the wayside, even when we’re acting as our own mentors.
I know I’ve written in the past about how I still have a hard time asking for help or feedback, and it’s something I’ve worked a lot on. I am slowly getting better, but have noticed that the culture of academia and science almost runs counter to that.
The current postdoc situation.
The Future of Research Symposium report from a group of enterprising postdocs really does address some of these problems with training and the perverse incentives in the system right now. It really resonates with me.
And in Science Careers this week, Beryl Lieff Benderly wrote about the recent National Academies report on the postdoc experience in her Taken for Granted column. It’s not a sunny report. It ends with this:
“I feel terrible for the cohort that’s been caught in the current crunch. It may be too late to help them, but if the academic science community can reach the conclusions implicit in the report and make the appropriate changes, future generations of young scientists may have much smoother and less painful transitions to satisfying and productive careers.”
As one of those ‘too late to help’, it really makes me feel like a sucker for taking a fool’s bet. I’ve written before about how, if young scientists aren’t enthusiastic about their work, they really can’t recommend it and instill it in the next generation as easily. Science is amazing, but it does not come before having a life. Too many postdocs of my generation fell squarely into an awareness gap about what academic training meant and ought to be. Of course, it’s hard to know how to pivot (especially when it all seems like it’s down to pure luck). I’ve also dealt with depression, which really stopped me in my tracks for awhile. I am really just now getting going again.
I’ve started some new projects on my own. I am doing a tumblr blog inspired by plant science where plants give advice to people. I know it’s not quite what the ASPB had in mind when they set up the hashtag, but it sparked the idea. And I started a new blog, The Quiet Branches, where I’m going to attempt to be like the great science communicators I see on Twitter through writing, a skill I’ve really tried to cultivate.
All of this is by way of saying I am growing, learning, trying to push out of the box I’ve been in with new thinking, trying new things, and basically doing at least some of those ‘take charge of your own career’ ideas people always say to do. I don’t know where I’ll end up. I don’t think I want to be an academic. Something in research communication might suit me well as teaching in any form is something I have a deep desire to do.
I hope future generations of scientists aren’t stuck like postdocs in my generation. My next post here will be on ‘the overtaxed expert’…we’re expected to know so much and yet now, with all the information out there, no one person can possibly process it all.
I’ve spent some time thinking about what I’ve built over the last few years as I have made my way out from someone that wanted to just leave the world to someone who wants to contribute in real ways, in positive ways (don’t we all?), and meaningful ways.
Coming out of the dark and into a world of wonder can be complicated. Going from feeling flat and divorced from connecting with the world to being vital and more engaged can be a scary process. I realize just how much I’ve missed out on, not going deep into any particular subject because I didn’t feel much in whatever I engaged in. I’ve written before about just what depression takes away from learning, and it’s hard to describe, since plenty of successful people have depression (perhaps they succeed despite it), and I can still read and write (perhaps not well, but it is something I work on) and do basic math. I feel I can learn things. But I have tended to lack the emotional connection to a subject that can boost learning. Depression feeds the fixed mindset rather than a growth mindset, too, with constant rumination and the voice that says ‘who do you think you are? You’re nothing, no one, and don’t matter’.
This blog has really documented that process for me. I hope I’ve been building a platform on which to build even better and greater things. Beth Buelow, an entrepreneur, coach, and introvert, talks in her really good book about an image series she saw of the Eiffel Tower being constructed. The builders put up the base quickly, and then progress appeared to stop for a long while before the tower was completed. During that apparently fallow time, the construction workers were reinforcing the structure, adding rivets and doing the preparatory work needed to build the tower: building a strong base to create what was then one of the tallest structures in the world, one that persists to this day.
I hope I’ve been building that kind of base. That I’ve gotten better in some key ways to start the next phase, to really get out into the world visibly. I do need reminders of how habit change can be most effective, like this one from James Clear. And it helps to be reminded to surround yourself with people who help you be your best. Though I find myself overdosing on ‘lifehacking’ lately (it can be great for ideas, but it’s easy to overdo or to be constantly trying new things). I’ve built up a system that kind of works, I think, and that’s healthy for me. Now I need to mold it into output that helps me grow more and gets me out into the world, being mindfully productive.
And as James Clear points out, prioritizing matters. Taken further, and perhaps scarier and harder, is finding the distinction between should and must and choosing the latter. And continuing to learn, grow, and retain new knowledge and experience through a system that works and is evolving. That also means being able to make decisions more rapidly than I do now, acting on them, and being guided by what is truly important to me.
What is essential?
I’m going to write down an ambition of mine: I want to be a science writer in some way, shape, or form. I love transmitting knowledge between minds. It seems to drive a lot of the decisions I make. It’s something that is more important to me than the research I do now. It’s an ambition that’s scary, but also seems deep-seated. I love science. I love writing, art, and popular culture. I love learning and teaching/communicating. Maybe it’s because I’ve listened to one too many podcasts and read one too many amazing pieces of writing about science that I’ve gone out of my mind, but why do I gravitate towards those things in the first place? And how do I get from where I am now to a new place? That’s not easy to answer.
Being a scientist now means having to wear a lot of hats, being seen as competent and amazing at many things that Ben Lillie (partially) listed, including having a public face to engage with non-scientists. It seems like people are expected to do more and more every year, to sacrifice our lives for our work, to produce ever more value. And whatever we do has to be quantified and standardized, even if that’s not the best or is too narrow a measure.
With the digital tools most of us have access to, we are expected to do everything ourselves, to produce more, always learn things flawlessly, and basically be perfect. And yet, that is unrealistic for any individual human. Not all of us are skilled at everything, but the 21st century world seems to demand that in an era of impatient teaching and exclusion if you’re not in the ‘in’ crowd from early on. And there is infinitely more to learn. And of course, digital tools allow for tracking of productivity more than ever.
Many circumstances can keep us from trying things that we’re truly suited to do. There’s a story attributed to Mark Twain (I can’t find a source) about a man seeking the world’s greatest general, only to die, go to heaven, and find that a cobbler would have been the greatest if given the opportunity. Did the cobbler just live at a time with no war, or was there a crucial moment where he didn’t take a leap into military life? If it’s the latter, hopefully there’s still time for me to make a leap. Maybe by not having an alternative, it’s possible.
Coding is something I am just starting to dabble in…and we’re all told it is the essential skill of the 21st century. I don’t know if that’s the case, but it certainly seems handy to any citizen of the Internet, where many of us spend our time. Even without a full understanding, knowing some of the theory behind the gorgeous websites we see each day is important. And it’s important to know that the people who build them are not perfect either, and often have biases and problems of their own. I don’t think this idea applies just to coding. To be in demand seems to mean being good at all the things and not needing a learning curve. Of course, that might be my warped perfectionist perception speaking.
A lot of science news is dedicated to reporting how we might all live better, parent better, be healthier, do more for the environment, and basically be better people if only we’d all behave, spend money, or act differently. Only that is vastly unrealistic. And the recommendations are often wrong because of flawed science. Science really is the last word on nothing.
What can we get wrong?
Phil Plait, in a post on his Slate blog, wrote about the response to a picture he tweeted of actresses who have a passion for science (great!). The problem comes with Mayim Bialik (who has a Ph.D. in neuroscience) and her anti-vaccination views, which are scientifically indefensible, as this NPR story on a documentary about the effects of not eradicating polio demonstrates. Keith Kloor addresses this with Dr. Oz and similar figures. Perhaps not as dangerous are Bill Nye’s anti-GMO views, if only because Nye, an engineer, does not have as informed a view of biology and doesn’t seem to be strongly anti-GMO as yet, just highly skeptical. He could change his mind yet. Bialik and Dr. Oz must know better, being more familiar with life sciences and medicine.
The process of robust science dictates that ideas and technologies supported by science (e.g. climate science, gravity, evolution, smart phones, vaccines, current GMOs) are in fact safe and work, and that is the final word (of course, each product needs to be taken on a case-by-case basis). Selective application is not acceptable. There are areas of science that are still debated, and the above ideas continue to be investigated and tested: to test new methods of delivery, to explain the parts of these ideas we don’t yet know the answers to, or to improve them in some way (or to create vaccines for viruses we don’t have vaccines for yet). And of course, scientists are never absolutely certain; we’re taught to critically examine our ideas and to design experiments and seek data that challenge them (which may happen less in an era of hyper-competition and tight funding).
In today’s world, it really appears unacceptable, especially as a public figure/celebrity, to say ‘I don’t know’ when pressed about some question that’s out there in the world (uncertainty being a perceived sign of weakness?! I would argue it’s the opposite). I am not a psychologist, social scientist, or neuroscientist, only a sufferer of depression and anxiety who has learned what I can about them and writes about my own solutions (some scientifically grounded, others likely less so). I’ve tried to strike a voice not of barfing-rainbows magical positivity, but of grounded optimism. I routinely say that I do not know, and I feel uncertain about most things, and this can be paralyzing. Who would do anything given the potential repercussions of getting something wrong? Phil Plait seems to have changed his mind after hearing from fellow bloggers about Bialik’s anti-vax views. I don’t even know where her anti-vax views stem from (is it a case like Dr. Oz, where his spouse seems to have opened the door to pseudoscience views?).
Some of these views may be caused by hastiness and shorthand, or a lack of time to think. In an era where we’re awash in information, it is impossible to be informed about everything, and yet we’re also too quick to be aghast when people don’t have views or don’t know something. At best, that reaction comes off as enthusiasm you want to impart to someone about a topic. At worst, it’s used as an identity marker to exclude people, even if they’re new enthusiasts for something you’ve been into for years…people get turned away because they’re new to something and simply don’t know as much. While I agree enthusiasm only takes you so far, it’s a spark that can carry you into new and unexpected places, and it shouldn’t be discouraged by whoever has deemed themselves a gatekeeper of a community.
There is demand to specialize and yet be a generalist at the same time. And to be instantly able to learn and absorb new things. I’m willing to work hard to figure things out, but if I’m given insufficient time to learn what I need to, I’m much more likely to make a mistake (and learning time seems shorter and shorter…and unexamined learning can lead to problems). We’re all encouraged to learn how to learn, and yet that seems hugely insufficient somehow. I am nearly paranoid about missing something critical or leaving some citation out. Of course, it’s not all about what we’re informed about. It’s also true that we develop identities around shared beliefs (‘people like me have this belief, I must think that too’) that can become quite entrenched in communities, in which case information alone cannot change someone’s mind, as work by Brendan Nyhan and others has shown.
It may be that I’m just worried about something I feel exists but isn’t actually as bad as it seems. However, everywhere I look, there are demands to be up on the latest everything and if not, you’re falling behind the times! Keep up or go away, you can’t compete and so shouldn’t even try. The world is complex and crazy and there is likely more awareness of that than ever. Being humble in the face of that is a virtue in my book. There is likely always more to a story. And just because we’re not always completely informed does not mean we can’t act or put our voices to an idea, but we need to listen to feedback and accept evidence contrary to what we think is going on. All of these mental gymnastics should underscore just how hard it is for scientists to come to strong theories about how the world works and when a scientific consensus is reached, it’s a big deal, and more credible than an individual report alone.
I am an academic scientist right now, trying to contribute to my field in a meaningful way and not add to the noise of wrong or hasty information that’s out in the world. Patience isn’t a virtue we hear a lot about anymore. The world seems to be more about speed and getting to something first. Instant may be good for some things, but I like to think of it like sources of coffee. I’ve never had a good cup of instant coffee; I’m not sure that exists. Putting in the work to grind beans, put them through a quality filter, and take the time to let it steep often (not always) makes for a better cup. Perhaps that’s my (highly) introverted side, which likes reflection, writing, and learning before speaking up. I hope any job I do hold will allow me to do just that, within reason, of course. I am determined to add value wherever I work, and I hope that the skills I gravitate towards and have developed are valued somewhere in the world.
I was re-listening (yes, I do this sometimes with things I find great) to an episode of one of my favorite podcasts, “Good Job, Brain” (it’s about pub trivia, and trivia, and knowledge, and the hosts are amazing; if you’re done with Serial…it’s different than that, but give it a listen).
This episode was about the circus. One of the hosts talked about how people that work in the circus and other performing arts were highly superstitious and cited a researcher saying that the people most likely to develop superstitious thinking are those in fields where the people have little control over what happens to them. There are a lot of things that could go wrong at a circus even if you do your own job perfectly. Same with sports, acting, comedy, mime, all that. And it suddenly occurred to me: uncertain environments, little direct control over our futures, funding, and just the chaos of doing research itself might mean scientists are prone to superstition, especially early career ones.
In the life sciences, we pray to PCR gods, take our pipette tips in certain patterns, and I’m sure more. Scientists don’t like to think we are superstitious, perhaps, but it seems like something we may well be prone to given the pressures academics are under these days. Dealing with those pressures seems to result in risk aversion, becoming more insular (i.e. less inclusive of diversity), less willing to ask for the help we need, less willing to leap into the unknown (a problem if you’re trying to figure out a plan B, C, D, or E career path), and more obsessive-compulsive than usual. So we may evade superstitions, but the same environments may make us more prone to these other issues. I’m not a social scientist, so I don’t know how all of these things interrelate or if they’re separable, but it does make logical sense (or perhaps that’s just confirmation bias).
So let’s do a yes/no/haven’t noticed poll. Reflecting on the current academic climate and how you behave, have you noticed yourself or the scientific community being superstitious?
Terry McGlynn (@hormiga) put in his application for Full Professor recently and wrote about how he described his blogging activity, trying to put it into context for the review committee and describing the benefits he gets out of it, most of which are not tangible or don’t really “count” by traditional academic metrics. He’s a teacher and a scholar. Productivity means syllabi and publications, for the most part.
And I agree that locally, blogging probably has no impact or is seen as a slight negative on the campus where he works. I try to keep my social media and blogging activity under wraps too. I don’t talk about it at work at all.
Except. Here’s the thing with my blog. It’s saved my life.
I don’t have 4,000 hits/month like Dr. McGlynn does, nor have I been a good scholar and published as I should. Katie Hinde (@mammals_suck) does, though, nicely lay out on her own blog the argument for publishing fewer “real” papers with more rigour and less status-chasing.
Also issued today was a National Academies report on the postdoc experience and suggested reforms. There are two posts about it in Science Careers here and here (and I’m sure a lot more coverage elsewhere– it’s a big deal for the science world).
Publishing matters. However, I have refused to play the game of chasing prestige; I’d rather do good work that’s correct than overhype some result. Of course, as I’ve written, I haven’t been productive. Failed projects, perfectionism, crippling impostorism, and clinical depression have all derailed productivity. Some of that is completely 1000% my fault. Some of it is the system of academia, though, and the mental health problems it can cause, as Melonie Fullick (@Qui_oui) writes. Largely, I have managed my mental health problems over the last year or so and am in a much better place to actually do something. And this year, in ways that academia would say don’t count, I have.
What has my postdoc experience been? Getting over depression, but also blogging. I don’t have a lot of hits each month, but blogging has helped me build a writing habit and given me opportunities that wouldn’t have existed otherwise. It’s helped me build out my network (mostly on Twitter). It was a way to put my voice out into the world that had no other place to go. If I hadn’t started writing, I honestly think it’s quite likely that I’d have gone the way of Stefan Grimm.
My blog has made me want to stay. To do better. To write more, to learn, explore, connect, and yes, do good science (a manuscript I’ve written will be submitted soon!). None of these things really counts in academia, though. I know that, and I still beat myself up sometimes that all I do is what anyone else can do: start a blog and type words on a page (the bloggers and writers I follow, by and large, all do it better than I do, in my opinion). Blogging has brought me back from the ledge. Perhaps I could have achieved the same ends with a personal journal, but at least my blog is something I write, publish, and maintain, and I made a commitment to write on it at least once a week.
The National Academies report seems useful for anyone just entering grad school or early on in their postdoc time. For me, it’s cold comfort, but I’m glad it’s out there to further the discussion of the postdoc experience and how it can be better for everyone involved.
So no, my blog doesn’t count, except that it does. It’s the most important thing to me. And I know that no one else probably cares, but it’s an archive of writing samples that I can trot out for discussions I see on Twitter. It’s also led me to new small projects like this:
My next goal is to write more about actual science (I don’t tend to say I want to be a science writer because currently, that seems outlandish somehow– I want to help the enterprise of science, but am still not sure if or where any talent I might have lies). I’m not sure if I’ll do it here or someplace else, but if my “alternate career” can involve writing, count me in.
And even if not, I’ll still find a way to keep writing online about things that interest me– like the Twitter discussion I was in earlier today that set off the horrifying thought that any image of a plant next to a DNA molecule now signifies GMO, not just a plant (because some may not realize plants, as living beings, have full genomes unto themselves). Perhaps that’s my next post.